Creators/Authors contains: "Bhattacharjee, R."


  1. We consider k-means clustering in an online setting where each new data point is assigned to its closest cluster center and incurs a loss equal to the squared distance to that center, after which the algorithm is allowed to update its centers. The goal over a data stream X is to achieve a total loss that is not too much larger than L(X, OPT), the best possible loss using k fixed centers in hindsight. We give the first algorithm to achieve polynomial space and time complexity in the online setting. 
  2. We consider k-means clustering in an online setting where each new data point is assigned to its closest cluster center and incurs a loss equal to the squared distance to that center, after which the algorithm is allowed to update its centers. The goal over a data stream X is to achieve a total loss that is not too much larger than the best possible loss using k fixed centers in hindsight. Ours is the first algorithm to achieve polynomial space and time complexity in the online setting. We note that our results have implications for the related streaming setting, where one final clustering is output, and the no-substitution setting, where center selections are permanent. We show a general reduction between the no-substitution cost of a blackbox algorithm and its online cost. Finally, we translate our algorithm to the no-substitution and streaming settings, where it competes with and can outperform existing work.
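The online protocol described in entries 1 and 2 can be made concrete with a short sketch. The code below is not the paper's algorithm; it is a naive MacQueen-style sequential k-means that merely illustrates the loss accounting: each arriving point is charged the squared distance to its closest current center, and only then may the centers move.

```python
import numpy as np

def online_kmeans_loss(stream, k):
    """Naive sequential k-means, for illustration only (NOT the paper's
    algorithm).  The first k points seed the centers; every later point
    is charged the squared distance to its nearest center, which is then
    nudged toward the point as a running mean."""
    centers, counts = [], []
    total_loss = 0.0
    for x in stream:
        x = np.asarray(x, dtype=float)
        if len(centers) < k:
            centers.append(x.copy())   # seed a center; no loss charged
            counts.append(1)
            continue
        d2 = ((np.stack(centers) - x) ** 2).sum(axis=1)
        j = int(np.argmin(d2))
        total_loss += float(d2[j])     # pay squared distance to closest center
        counts[j] += 1                 # then update that center (running mean)
        centers[j] = centers[j] + (x - centers[j]) / counts[j]
    return total_loss, np.stack(centers)
```

On a stream of two tight clusters with k=2, the total loss stays small because each point lands near a seeded center; the paper's contribution is bounding this loss against the best k fixed centers in hindsight while using polynomial space and time.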
  3. Dasgupta, S.; Haghtalab, N. (Eds.)
    We consider a lifelong learning scenario in which a learner faces a never-ending and arbitrary stream of facts and has to decide which ones to retain in its limited memory. We introduce a mathematical model based on the online learning framework, in which the learner measures itself against a collection of experts that are also memory-constrained and that reflect different policies for what to remember. Interspersed with the stream of facts are occasional questions, and on each of these the learner incurs a loss if it has not remembered the corresponding fact. Its goal is to do almost as well as the best expert in hindsight, while using roughly the same amount of memory. We identify difficulties with using the multiplicative weights update algorithm in this memory-constrained scenario, and design an alternative scheme whose regret guarantees are close to the best possible.
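For context on the baseline this abstract contrasts with, here is a minimal sketch of one round of the standard multiplicative weights (Hedge) update over experts. This is the textbook update, not the paper's memory-constrained alternative; the abstract's point is precisely that this scheme runs into difficulties when the experts, and the learner, are memory-limited.

```python
import numpy as np

def hedge_update(weights, losses, eta=0.5):
    """One round of the multiplicative weights (Hedge) update:
    each expert's weight is scaled by exp(-eta * loss), then the
    weights are renormalized to a probability distribution."""
    w = np.asarray(weights, dtype=float) * np.exp(-eta * np.asarray(losses, dtype=float))
    return w / w.sum()
```

After one round, experts that incurred lower loss carry more weight, which is what yields the classical regret bound against the best expert in hindsight.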
  4. We consider the problem of embedding a relation, represented as a directed graph, into Euclidean space. For three types of embeddings motivated by the recent literature on knowledge graphs, we obtain characterizations of which relations they are able to capture, as well as bounds on the minimal dimensionality and precision needed. 
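To make the embedding question in entry 4 concrete, the sketch below checks whether a TransE-style translational embedding, one common style from the knowledge-graph literature, captures a given directed relation. This is an assumed illustration: the paper analyzes three embedding types, and this need not be one of them.

```python
import numpy as np

def transe_captures(edges, nodes, emb, r, tau):
    """Check whether a TransE-style embedding captures a directed relation:
    the edge (u, v) is declared present iff ||emb[u] + r - emb[v]|| <= tau.
    Returns True iff this rule reproduces the edge set exactly.
    (One illustrative embedding style; not necessarily one of the
    three types analyzed in the paper.)"""
    edge_set = set(edges)
    for u in nodes:
        for v in nodes:
            if u == v:
                continue
            predicted = np.linalg.norm(emb[u] + r - emb[v]) <= tau
            if predicted != ((u, v) in edge_set):
                return False
    return True
```

Questions of the kind the abstract raises then become: which directed graphs admit such an exact representation, and how large must the dimension and precision of the vectors be?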
  5. A growing body of research has shown that many classifiers are susceptible to adversarial examples: small strategic modifications to test inputs that lead to misclassification. In this work, we study general non-parametric methods, with a view towards understanding when they are robust to these modifications. We establish general conditions under which non-parametric methods are r-consistent, in the sense that they converge to optimally robust and accurate classifiers in the large sample limit. Concretely, our results show that when data is well-separated, nearest neighbors and kernel classifiers are r-consistent, while histograms are not. For general data distributions, we prove that preprocessing by Adversarial Pruning (Yang et al., 2019), which makes data well-separated, followed by nearest neighbors or kernel classifiers also leads to r-consistency.
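The robustness notion in entry 5 can be illustrated for 1-nearest-neighbor classification with a simple sufficient certificate, sketched below under the assumption of Euclidean distances. If the closest opposite-label training point is more than 2r farther from the test point than the closest same-label point, then no perturbation of norm at most r can flip the prediction. This is only an illustrative check, not the paper's consistency analysis.

```python
import numpy as np

def certified_radius_1nn(X, y, x):
    """Sufficient robustness certificate for a 1-nearest-neighbor prediction.
    If d_other - d_same > 2r, a perturbation of x with norm <= r can move
    the same-label distance up to d_same + r and the opposite-label distance
    down to d_other - r, so the prediction cannot flip.
    Returns (predicted label, certified radius)."""
    d = np.linalg.norm(X - x, axis=1)   # distances to all training points
    pred = y[np.argmin(d)]              # 1-NN prediction at x
    if not np.any(y != pred):           # only one class present: always robust
        return pred, np.inf
    d_same = d[y == pred].min()
    d_other = d[y != pred].min()
    return pred, max(0.0, (d_other - d_same) / 2.0)
```

When the data is well-separated in the sense the abstract describes, such certified radii stay bounded away from zero, which is the intuition behind the r-consistency of nearest-neighbor methods.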
  6. Excited states of the 64Cu (Z=29, N=35) nucleus have been probed using a heavy-ion-induced fusion-evaporation reaction, with an array of Compton-suppressed Clover detectors recording the emitted γ rays. More than 50 new transitions have been identified, and the level scheme of the nucleus has been established up to an excitation energy Ex ∼ 6 MeV and spin ∼10ℏ. The experimental results have been compared with large-basis shell-model calculations, which facilitated an understanding of the single-particle configurations underlying the level structure of the nucleus.